Communication-Efficient Distributed Cooperative Learning With Compressed Beliefs

Authors

Abstract

In this article, we study the problem of distributed cooperative learning, where a group of agents seeks to agree on the set of hypotheses that best describes a sequence of private observations. In the scenario where the hypothesis set is large, we propose a belief update rule in which agents share compressed (either sparse or quantized) beliefs with an arbitrary positive compression rate. Our algorithm leverages a unified communication rule that enables agents to access a wide range of compression operators as black-box modules. We prove almost sure asymptotic convergence of the beliefs to the set of optimal hypotheses. Additionally, we show a nonasymptotic, explicit, and linear concentration rate, in probability, of the beliefs on the optimal hypothesis set. We provide numerical experiments that illustrate the benefits of our method. The simulation results show that the number of transmitted bits can be reduced to 5%–10% of that of the noncompressed method in the studied scenarios.
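To make the setting concrete, the following Python sketch shows one communication round in the spirit of the abstract: each agent sparsifies its belief vector (a top-k operator stands in for the generic compression module), mixes it with neighbors' compressed beliefs through row-stochastic weights, and reweights by the likelihood of its latest private observation. The compression operator, mixing step, and variable names are illustrative assumptions, not the authors' exact update rule.

```python
# Illustrative sketch only (not the authors' exact rule): one round of
# cooperative learning in which each agent mixes its own belief with the
# compressed beliefs received from neighbors, then reweights by the local
# likelihood of its latest private observation.
import numpy as np

def top_k_sparsify(belief, k):
    """Keep the k largest entries of a belief vector, zero out the rest."""
    compressed = np.zeros_like(belief)
    idx = np.argsort(belief)[-k:]          # indices of the k largest entries
    compressed[idx] = belief[idx]
    return compressed

def belief_round(beliefs, likelihoods, adjacency, k):
    """beliefs: (n_agents, n_hypotheses); likelihoods: per-agent likelihood of
    the new observation under each hypothesis; adjacency: row-stochastic
    mixing weights. Returns updated beliefs after one communication round."""
    n_agents = beliefs.shape[0]
    sent = np.array([top_k_sparsify(b, k) for b in beliefs])   # compressed messages
    new_beliefs = np.empty_like(beliefs)
    for i in range(n_agents):
        # Mix own (uncompressed) belief with neighbors' compressed beliefs.
        mixed = adjacency[i, i] * beliefs[i]
        for j in range(n_agents):
            if j != i and adjacency[i, j] > 0:
                mixed += adjacency[i, j] * sent[j]
        # Local Bayesian reweighting by the likelihood of the new observation.
        posterior = mixed * likelihoods[i]
        new_beliefs[i] = posterior / posterior.sum()
    return new_beliefs
```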


Similar Articles

Communication-Efficient Distributed Learning of Discrete Distributions

We initiate a systematic investigation of distribution learning (density estimation) when the data is distributed across multiple servers. The servers must communicate with a referee and the goal is to estimate the underlying distribution with as few bits of communication as possible. We focus on non-parametric density estimation of discrete distributions with respect to the ℓ1 and ℓ2 norms. We...
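As a rough illustration of the setting (an assumed protocol, not the paper's actual one), the sketch below has each server send a coarsely quantized empirical histogram to the referee, which averages the messages into a single estimate; the quantizer, bit budget, and function names are assumptions made for this example.

```python
# Minimal sketch (assumed protocol): each server quantizes its empirical
# histogram to a few bits per coordinate and sends it to a referee, who
# averages the quantized histograms to estimate the underlying distribution.
import numpy as np

def quantize(p, bits):
    """Uniformly quantize each probability to the given number of bits."""
    levels = 2 ** bits - 1
    return np.round(p * levels) / levels

def referee_estimate(local_samples, domain_size, bits=8):
    """local_samples: list of integer arrays, one per server."""
    messages = []
    for samples in local_samples:
        hist = np.bincount(samples, minlength=domain_size).astype(float)
        hist /= hist.sum()                      # empirical distribution on this server
        messages.append(quantize(hist, bits))   # low-bit message to the referee
    estimate = np.mean(messages, axis=0)
    total = estimate.sum()
    # Renormalize after quantization (fall back to uniform if everything rounded to zero).
    return estimate / total if total > 0 else np.full(domain_size, 1.0 / domain_size)
```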


Communication Efficient Distributed Machine Learning with the Parameter Server

This paper describes a third-generation parameter server framework for distributed machine learning. This framework offers two relaxations to balance system performance and algorithm efficiency. We propose a new algorithm that takes advantage of this framework to solve non-convex non-smooth problems with convergence guarantees. We present an in-depth analysis of two large scale machine learning...


Efficient Distributed Learning with Sparsity

We propose a novel, efficient approach for distributed sparse learning with observations randomly partitioned across machines. In each round of the proposed method, worker machines compute the gradient of the loss on local data and the master machine solves a shifted ℓ1 regularized loss minimization problem. After a number of communication rounds that scales only logarithmically with the number...
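A minimal sketch of one round along these lines is given below; the exact "shifted ℓ1" subproblem, constants, and solver in the paper may differ. Here the master minimizes its local least-squares loss plus a linear shift term (the averaged worker gradients minus its own gradient) and an ℓ1 penalty, using a plain proximal-gradient (ISTA) loop.

```python
# Rough sketch of one communication round (illustrative, not the paper's exact
# subproblem): workers send local gradients, and the master solves its local
# loss plus a linear "shift" term correcting for remote data, plus an l1
# penalty, via proximal gradient (ISTA).
import numpy as np

def soft_threshold(v, t):
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def worker_gradient(X, y, w):
    """Gradient of the local least-squares loss at the current iterate w."""
    return X.T @ (X @ w - y) / len(y)

def master_solve(X, y, w0, worker_grads, lam=0.1, step=0.01, iters=500):
    """Minimize L_master(w) + <shift, w> + lam*||w||_1, where the shift is the
    averaged worker gradient minus the master's own gradient at w0."""
    shift = np.mean(worker_grads, axis=0) - worker_gradient(X, y, w0)
    w = w0.copy()
    for _ in range(iters):
        grad = worker_gradient(X, y, w) + shift      # shifted gradient
        w = soft_threshold(w - step * grad, step * lam)
    return w
```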


Efficient Semantic Communication via Compatible Beliefs

In previous works, Juba and Sudan [6] and Goldreich, Juba and Sudan [4] considered the idea of “semantic communication”, wherein two players, a user and a server, attempt to communicate with each other without any prior common language (or communication) protocol. They showed that if communication was goal-oriented and the user could sense progress towards the goal (or verify when it has been a...


A Distributed Frank-Wolfe Algorithm for Communication-Efficient Sparse Learning

Learning sparse combinations is a frequent theme in machine learning. In this paper, we study its associated optimization problem in the distributed setting where the elements to be combined are not centrally located but spread over a network. We address the key challenges of balancing communication costs and optimization errors. To this end, we propose a distributed Frank-Wolfe (dFW) algorithm...
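The sketch below illustrates one Frank-Wolfe step over an ℓ1 ball when the gradient coordinates are partitioned across nodes: each node reports only its locally best atom (one index and one value), the globally best atom is selected, and every node applies the same rank-one update. This is a simplified stand-in for the idea, not the dFW algorithm's exact message and update scheme.

```python
# Simplified distributed Frank-Wolfe step over an l1 ball (illustrative):
# the coordinates are partitioned across nodes, each node communicates only
# its locally best atom, and all nodes apply the same sparse update.
import numpy as np

def frank_wolfe_round(grad_blocks, x, radius, t):
    """grad_blocks: list of (indices, gradient values) pairs, one per node,
    covering a partition of the coordinates; optimization is over the l1 ball
    of the given radius. Returns the next iterate."""
    # Each node communicates one (index, value): its coordinate of largest |gradient|.
    candidates = []
    for idx, g in grad_blocks:
        j = np.argmax(np.abs(g))
        candidates.append((idx[j], g[j]))
    # Pick the globally best atom; the FW vertex is -radius * sign(grad) on that coordinate.
    i_best, g_best = max(candidates, key=lambda c: abs(c[1]))
    s = np.zeros_like(x)
    s[i_best] = -radius * np.sign(g_best)
    gamma = 2.0 / (t + 2)                      # standard Frank-Wolfe step size
    return (1 - gamma) * x + gamma * s
```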



Journal

Journal title: IEEE Transactions on Control of Network Systems

Year: 2022

ISSN: 2325-5870, 2372-2533

DOI: https://doi.org/10.1109/tcns.2022.3198791